Convex Variational Formulations for Learning Problems
Abstract
In this article, we introduce new techniques for solving nonlinear regression and nonlinear classification problems. Our benchmarks suggest that our regression method is significantly more effective than classical methods and that our classification method is competitive. The classical methods we compare against include least squares, random forests, decision trees, boosted trees, nearest neighbors, logistic regression, SVMs, and neural networks. These new techniques rely on convex variational formulations of the nonlinear regression and nonlinear classification problems. For regression, we choose a function minimizing an energy functional plus the squared error of the predictions. For classification, we choose a function minimizing an energy functional plus the costs of misclassification. These convex variational formulations also provide information for performing dimensionality reduction and for studying the dependencies between the variables of the problem. We also derive a notion of complexity for regression and classification problems. Finding such minimizing functions turns out to be a simple quadratic optimization problem that can be solved efficiently. Here we present the methods in a way that can be easily understood by practitioners, without going into mathematical details.

Keywords—Regression, Nonlinear Regression, Classification, Nonlinear Classification, Variational Formulations, Optimization, Quadratic Programming, Machine Learning
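The abstract describes choosing a regression function that minimizes an energy functional plus the squared error of the predictions, and notes that this reduces to a quadratic optimization problem. The article does not specify the energy functional, so the sketch below is a hedged illustration under one common assumption: the energy functional is an RKHS norm induced by a Gaussian kernel, in which case the quadratic objective min_α ||Kα − y||² + λ αᵀKα is solved by the linear system (K + λI)α = y. The function names `gaussian_kernel`, `fit`, and `predict` are illustrative, not from the article.

```python
import numpy as np

def gaussian_kernel(X, Z, gamma=1.0):
    """Gram matrix K[i, j] = exp(-gamma * ||x_i - z_j||^2)."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit(X, y, lam=1e-3, gamma=1.0):
    """Minimize ||K a - y||^2 + lam * a^T K a.

    Setting the gradient to zero gives (K + lam * I) a = y, so the
    quadratic program reduces to a single linear solve (an assumed
    energy functional; the article leaves it unspecified).
    """
    K = gaussian_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def predict(X_train, alpha, X_new, gamma=1.0):
    """Evaluate the minimizer f(x) = sum_i alpha_i k(x, x_i)."""
    return gaussian_kernel(X_new, X_train, gamma) @ alpha

# Toy usage: fit y = sin(x) on a handful of sample points.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X[:, 0])
alpha = fit(X, y)
train_err = np.max(np.abs(predict(X, alpha, X) - y))
```

Because the objective is convex and quadratic in α, the solution is unique and obtained at the cost of one dense linear solve, which matches the abstract's claim that the method "can be solved efficiently."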
Similar Resources
Strong convergence theorem for a class of multiple-sets split variational inequality problems in Hilbert spaces
In this paper, we introduce a new iterative algorithm for approximating a common solution of a certain class of multiple-sets split variational inequality problems. The sequence generated by the proposed iterative algorithm is proved to converge strongly in Hilbert spaces. As an application, we obtain some strong convergence results for some classes of multiple-sets split convex minimization problems.
Merit Functions and Descent Algorithms for a Class of Variational Inequality Problems
We consider a variational inequality problem, where the cost mapping is the sum of a single-valued mapping and the subdifferential mapping of a convex function. For this problem we introduce a new class of equivalent optimization formulations; based on them, we also provide the first convergence analysis of descent algorithms for the problem. The optimization formulations constitute generalizat...
Tutorial on variational approximation methods
Tutorial topics:
• A bit of history
• Examples of variational methods
• A brief intro to graphical models
• Variational mean field theory
  – Accuracy of variational mean field
  – Structured mean field theory
• Variational methods in Bayesian estimation
• Convex duality and variational factorization methods
  – Example: variational inference and the QMR-DT
Variational methods
• Classical setting: "f...
Vector Optimization Problems and Generalized Vector Variational-Like Inequalities
In this paper, some properties of pseudoinvex functions, defined by means of limiting subdifferential, are discussed. Furthermore, the Minty vector variational-like inequality, the Stampacchia vector variational-like inequality, and the weak formulations of these two inequalities defined by means of limiting subdifferential are studied. Moreover, some relationships between the vector vari...
Variational inequalities on Hilbert $C^*$-modules
We introduce variational inequality problems on Hilbert $C^*$-modules and prove several existence results for variational inequalities defined on closed convex sets. The relation between variational inequalities, $C^*$-valued metric projection, and fixed point theory on Hilbert $C^*$-modules is then studied.